
    Verification and Validation of Semantic Annotations

    In this paper, we propose a framework to perform verification and validation of semantically annotated data. The annotations, extracted from websites, are verified against the schema.org vocabulary and Domain Specifications to ensure the syntactic correctness and completeness of the annotations. The Domain Specifications allow checking the compliance of annotations against corresponding domain-specific constraints. The validation mechanism will detect errors and inconsistencies between the content of the analyzed schema.org annotations and the content of the web pages where the annotations were found. (Comment: Accepted for the proceedings of the A.P. Ershov Informatics Conference 2019, the PSI Conference Series, 12th edition.)
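    Below is a minimal sketch of the kind of check the abstract describes, assuming a Domain Specification can be expressed as a SHACL shape and evaluated with rdflib and pyshacl; the hotel annotation, the HotelShape constraint, and the property choices are made up for illustration and are not the paper's actual Domain Specification format.

```python
# Sketch: verify a schema.org annotation against a domain-specific constraint.
# Assumes rdflib >= 6 (native JSON-LD parsing) and pyshacl are installed.
from rdflib import Graph
from pyshacl import validate

# A schema.org annotation as it might be extracted from a web page (JSON-LD).
# Hypothetical example data, not taken from the paper.
annotation = """
{
  "@context": {"schema": "https://schema.org/"},
  "@type": "schema:Hotel",
  "schema:name": "Alpenhof",
  "schema:address": "Innsbruck"
}
"""

# A hypothetical domain-specific constraint expressed as a SHACL shape:
# every Hotel must state a name and a structured PostalAddress.
domain_spec = """
@prefix sh:     <http://www.w3.org/ns/shacl#> .
@prefix schema: <https://schema.org/> .

schema:HotelShape
    a sh:NodeShape ;
    sh:targetClass schema:Hotel ;
    sh:property [ sh:path schema:name ; sh:minCount 1 ] ;
    sh:property [ sh:path schema:address ; sh:class schema:PostalAddress ] .
"""

data = Graph().parse(data=annotation, format="json-ld")
shapes = Graph().parse(data=domain_spec, format="turtle")

conforms, _, report = validate(data, shacl_graph=shapes)
print(conforms)   # False: the address is a plain string, not a PostalAddress
print(report)
```

    Running the sketch reports a violation because the address is given as a literal rather than a structured PostalAddress, which is the kind of domain-specific compliance check the abstract refers to.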

    Metrics-Driven Framework for LOD Quality Assessment

    NLP Data Cleansing Based on Linguistic Ontology Constraints

    RDF Dataset Profiling

    In the context of this chapter, an RDF dataset is defined in accordance with the dataset definition in the Vocabulary of Interlinked Datasets (VoID) (http://vocab.deri.ie/void), namely: “A Dataset is a set of RDF triples that are published, maintained or aggregated by a single provider.” According to VoID, a dataset represents a meaningful collection of triples as envisioned by its provider. An RDF dataset profile is a formal representation of a set of dataset characteristics (features). It describes the dataset and aids dataset discovery, recommendation, and comparison with regard to the represented features. A dataset profile feature is a characteristic describing a certain attribute of the dataset. For instance, “dataset conciseness” is a dataset profile feature providing information on the degree of redundancy of the information contained in the dataset. A dataset profile is extensible with respect to the features it contains. Usually, the relevant feature set is…
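    As a concrete illustration of what such a profile might look like, the sketch below computes a few basic statistical features of a dataset and serializes them as a VoID description using rdflib; the dataset file name and profile URI are placeholders, and the feature set (triple count, distinct subjects, distinct properties) is a minimal assumption rather than the chapter's full feature catalogue.

```python
# Sketch: build a small VoID-based dataset profile with rdflib.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, XSD

VOID = Namespace("http://rdfs.org/ns/void#")

# Load the dataset to be profiled (placeholder file name; any RDF
# serialization that rdflib understands would do).
dataset = Graph().parse("dataset.ttl", format="turtle")

profile = Graph()
profile.bind("void", VOID)
ds = URIRef("http://example.org/dataset/my-dataset")  # placeholder URI

profile.add((ds, RDF.type, VOID.Dataset))
profile.add((ds, VOID.triples,
             Literal(len(dataset), datatype=XSD.integer)))
profile.add((ds, VOID.distinctSubjects,
             Literal(len(set(dataset.subjects())), datatype=XSD.integer)))
profile.add((ds, VOID.properties,
             Literal(len(set(dataset.predicates())), datatype=XSD.integer)))

print(profile.serialize(format="turtle"))
```

    void:triples, void:distinctSubjects, and void:properties are standard VoID statistical properties, so a profile expressed this way remains usable by other VoID-aware tools for discovery and comparison.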

    Big Data in the Public Sector. Linking Cities to Sensors

    Part 5: Big and Open Linked Data. In the public sector, big data holds many promises for improving policy outcomes in terms of service delivery and decision-making, and is starting to gain increased attention from governments. Cities are collecting large amounts of data from traditional sources such as registries and surveys and from non-traditional sources such as the Internet of Things, and are considered an important field of experimentation for generating public value with big data. The establishment of a city data infrastructure can drive such a development. This paper describes two key challenges for such an infrastructure, platform federation and data quality, and how these challenges are addressed in the ongoing research project CPaaS.io.